Moment Bounds and Mean Squared Prediction Errors of Long-memory Time Series

Authors

  • NGAI HANG CHAN
  • SHIH-FENG HUANG
Abstract

A moment bound for the normalized conditional-sum-of-squares (CSS) estimate of a general autoregressive fractionally integrated moving average (ARFIMA) model with an arbitrary unknown memory parameter is derived in this paper. To achieve this goal, a uniform moment bound for the inverse of the normalized objective function is established. An important application of these results is to establish asymptotic expressions for the one-step and multi-step mean squared prediction errors (MSPE) of the CSS predictor. These asymptotic expressions not only demonstrate explicitly how the multi-step MSPE of the CSS predictor varies with the model complexity and the dependence structure, but also offer a means to compare the performance of the CSS predictor with that of the least squares (LS) predictor for integrated autoregressive models. It turns out that the CSS predictor can gain a substantial advantage over the LS predictor when the integration order is high. Numerical studies are also conducted to illustrate the theoretical results.
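As a rough numerical illustration of this comparison, the following Python sketch estimates the empirical multi-step MSPE of a CSS-type predictor and an unrestricted LS predictor by Monte Carlo. Everything in it is an illustrative assumption rather than the paper's actual experiment: the ARIMA(1,2,0) data-generating process, the CSS objective built from the truncated AR(∞) representation, the Nelder-Mead optimizer, and the AR(3) fitted in levels as the LS competitor.

```python
# Illustrative Monte Carlo sketch (not the paper's experiment): compare a CSS-type
# predictor, which estimates the memory parameter d and an AR(1) coefficient jointly,
# with an unrestricted LS autoregressive predictor fitted in levels, for data from an
# ARIMA(1,2,0) model (integration order 2).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)


def frac_diff_weights(d, n):
    """Coefficients pi_j of (1 - B)^d for j = 0, ..., n-1 (pi_0 = 1)."""
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w


def ar_inf_coefs(d, phi, n):
    """Coefficients a_j of (1 - phi*B)(1 - B)^d, j = 0, ..., n-1 (a_0 = 1)."""
    pi = frac_diff_weights(d, n)
    a = pi.copy()
    a[1:] -= phi * pi[:-1]
    return a


def css_fit(y):
    """Estimate (d, phi) by minimizing the conditional sum of squares."""
    def obj(params):
        a = ar_inf_coefs(params[0], params[1], len(y))
        e = np.convolve(a, y)[: len(y)]  # e_t = sum_j a_j y_{t-j}, pre-sample data set to 0
        return np.sum(e ** 2)
    return minimize(obj, x0=np.array([1.0, 0.0]), method="Nelder-Mead").x


def forecast_css(y, d, phi, h):
    """Recursive h-step forecast from the truncated AR(infinity) representation."""
    hist = list(y)
    for _ in range(h):
        a = ar_inf_coefs(d, phi, len(hist) + 1)
        hist.append(-a[1:] @ np.asarray(hist[::-1]))
    return hist[-1]


def forecast_ls(y, p, h):
    """Unrestricted AR(p) fitted in levels by OLS, then recursive h-step forecast."""
    X = np.column_stack([y[p - k: len(y) - k] for k in range(1, p + 1)])
    beta = np.linalg.lstsq(X, y[p:], rcond=None)[0]
    hist = list(y)
    for _ in range(h):
        hist.append(np.asarray(hist[-1: -p - 1: -1]) @ beta)
    return hist[-1]


def simulate_arima_1_2_0(n, phi, burn=200):
    """(1 - phi*B)(1 - B)^2 y_t = eps_t (initial conditions aside), Gaussian innovations."""
    eps = rng.standard_normal(n + burn)
    w = np.zeros(n + burn)
    for t in range(1, n + burn):
        w[t] = phi * w[t - 1] + eps[t]
    return np.cumsum(np.cumsum(w[burn:]))  # integrate the AR(1) series twice


n, h, phi, reps = 200, 5, 0.5, 100
err_css, err_ls = [], []
for _ in range(reps):
    y = simulate_arima_1_2_0(n + h, phi)
    train, target = y[:n], y[-1]
    d_hat, phi_hat = css_fit(train)
    err_css.append(forecast_css(train, d_hat, phi_hat, h) - target)
    err_ls.append(forecast_ls(train, 3, h) - target)

print("empirical h-step MSPE, CSS predictor:", np.mean(np.square(err_css)))
print("empirical h-step MSPE, LS  predictor:", np.mean(np.square(err_ls)))
```

The CSS predictor in this sketch treats the integration order as part of the unknown memory parameter d, which is the setting in which the abstract compares it with the unrestricted LS fit in levels.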

Similar articles

Uniform Moment Bounds of Fisher’s Information with Applications to Time Series

In this paper, a uniform (over some parameter space) moment bound for the inverse of Fisher’s information matrix is established. This result is then applied to develop moment bounds for the normalized least squares estimate in (nonlinear) stochastic regression models. The usefulness of these results is illustrated using time series models. In particular, an asymptotic expression for the mean sq...
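Schematically (the notation below is assumed here for illustration; the exponent q, the normalization n^{-1}, and the parameter space Θ are placeholders, not the paper's exact statement), a uniform inverse moment bound of this kind takes the form

$$ \sup_{n \ge n_0}\, \mathrm{E}\Bigl[\,\sup_{\theta \in \Theta}\, \lambda_{\min}^{-q}\bigl(n^{-1} I_n(\theta)\bigr)\Bigr] < \infty, $$

where $I_n(\theta)$ denotes Fisher's information matrix based on n observations and $\lambda_{\min}(\cdot)$ its smallest eigenvalue. Controlling negative moments of the smallest eigenvalue uniformly over the parameter space is what yields moment bounds for the normalized least squares estimate.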

Prediction of long memory processes on same-realisation

For the class of stationary Gaussian long memory processes, we study some properties of the least-squares predictor of Xn+1 based on (Xn, . . . , X1). The predictor is obtained by projecting Xn+1 onto the finite past and the coefficients of the predictor are estimated on the same realisation. First we prove moment bounds for the inverse of the empirical covariance matrix. Then we deduce an asymp...
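A minimal sketch of this finite-past projection, with the projection coefficients estimated from the same realisation via sample autocovariances (one natural plug-in; the paper's exact estimator is not reproduced here):

```python
# Sketch: project X_{n+1} onto its last k observed values, with the projection
# coefficients estimated from the same realisation via sample autocovariances.
import numpy as np

def sample_autocov(x, max_lag):
    """gamma_hat(0), ..., gamma_hat(max_lag) of a demeaned series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.array([x[: n - h] @ x[h:] / n for h in range(max_lag + 1)])

def finite_past_predictor(x, k):
    """One-step predictor of X_{n+1} based on (X_n, ..., X_{n-k+1})."""
    gamma = sample_autocov(x, k)
    Gamma_k = np.array([[gamma[abs(i - j)] for j in range(k)] for i in range(k)])
    coef = np.linalg.solve(Gamma_k, gamma[1: k + 1])  # empirical projection coefficients
    recent = np.asarray(x)[:-k - 1:-1]                # (X_n, X_{n-1}, ..., X_{n-k+1})
    return coef @ (recent - np.mean(x)) + np.mean(x)

# toy usage on a short AR(1) path (illustrative only)
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
print(finite_past_predictor(x, k=10))
```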

Inverse moment bounds for sample autocovariance matrices based on detrended time series and their applications

In this paper, we assume that observations are generated by a linear regression model with short- or long-memory dependent errors. We establish inverse moment bounds for kn-dimensional sample autocovariance matrices based on the least squares residuals (also known as the detrended time series), where kn ≪ n, kn → ∞ and n is the sample size. These results are then used to derive the mean-square erro...
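For concreteness, the following short sketch builds the object whose inverse moments are being bounded: the kn-dimensional sample autocovariance matrix of least squares residuals from a detrending regression. The linear trend, the noise process, and the choice kn ≈ √n are illustrative assumptions only.

```python
# Sketch of the object being bounded: the k_n x k_n sample autocovariance matrix
# of least squares residuals from a (here, linear-trend) regression.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
t = np.arange(n)
noise = np.convolve(rng.standard_normal(n + 50), 0.9 ** np.arange(50))[:n]  # dependent errors
y = 1.0 + 0.02 * t + noise

# detrend by least squares
X = np.column_stack([np.ones(n), t])
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# k_n-dimensional sample autocovariance matrix of the residuals
k_n = int(np.sqrt(n))
gamma = np.array([e[: n - h] @ e[h:] / n for h in range(k_n)])
Gamma = np.array([[gamma[abs(i - j)] for j in range(k_n)] for i in range(k_n)])

# the inverse moment bounds concern quantities such as the smallest eigenvalue of Gamma
print("lambda_min(Gamma_kn) =", np.linalg.eigvalsh(Gamma).min())
```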

Predictor selection for positive autoregressive processes

Let observations y1, · · · , yn be generated from a first-order autoregressive (AR) model with positive errors. In both the stationary and unit root cases, we derive moment bounds and limiting distributions of an extreme value estimator, ρ̂n, of the AR coefficient. These results enable us to provide asymptotic expressions for the mean squared error (MSE) of ρ̂n and the mean squared prediction err...
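A toy sketch of an extreme value estimator in this setting (the minimum-ratio form shown below is a standard choice for positive-error AR(1) models and is assumed here for illustration; the paper's predictor-selection procedure itself is not reproduced):

```python
# Illustrative extreme value estimator for y_t = rho * y_{t-1} + e_t with e_t > 0.
import numpy as np

rng = np.random.default_rng(3)
n, rho = 500, 0.8
e = rng.exponential(size=n)          # positive innovations
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = rho * y[t - 1] + e[t]

# with positive errors, y_t / y_{t-1} = rho + e_t / y_{t-1} >= rho for every t,
# so the smallest observed ratio is a natural extreme value estimate of rho
rho_hat = np.min(y[1:] / y[:-1])
print("rho_hat =", rho_hat, " one-step predictor:", rho_hat * y[-1])
```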

Linear Prediction of Long-Range Dependent Time Series

We present two approaches for next-step linear prediction of long memory time series. The first is based on the truncation of the Wiener-Kolmogorov predictor by restricting the observations to the last k terms, which are the only available values in practice. Part of the mean squared prediction error comes from the truncation, and another part comes from the parametric estimation of the paramet...
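In schematic notation (assumed here, not taken from the paper), the truncated predictor of the first approach is

$$ \widehat{X}_{n+1}(k) \;=\; -\sum_{j=1}^{k} a_j\, X_{n+1-j}, $$

where the $a_j$ are the coefficients of the AR(∞) representation of the long-memory process, so the mean squared prediction error decomposes into the innovation variance, a truncation term from using only the last k observations, and an estimation term from replacing the model parameters entering $a_j$ by their estimates.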

Publication year: 2013